Machine Learning using Hyperkernels
Authors
Abstract
We expand on the problem of learning a kernel via an RKHS on the space of kernels itself. The resulting optimization problem is shown to be a semidefinite program. We demonstrate that the kernel can be learned for various formulations of machine learning problems. Specifically, we provide mathematical programming formulations and experimental results for the C-SVM, ν-SVM, and Lagrangian SVM for classification on UCI data, and for novelty detection.
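The paper's full hyperkernel formulation leads to a semidefinite program; as a loose, self-contained illustration of treating the kernel itself as the object being learned, the sketch below selects an RBF width by maximizing kernel-target alignment over a candidate family. The alignment criterion and the candidate grid are illustrative assumptions, not the paper's method.

```python
import math

def rbf_gram(xs, gamma):
    """Gram matrix of the Gaussian RBF kernel k(a, b) = exp(-gamma * (a - b)^2)."""
    return [[math.exp(-gamma * (a - b) ** 2) for b in xs] for a in xs]

def alignment(K, y):
    """Kernel-target alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F), y in {-1, +1}."""
    n = len(y)
    num = sum(K[i][j] * y[i] * y[j] for i in range(n) for j in range(n))
    k_norm = math.sqrt(sum(K[i][j] ** 2 for i in range(n) for j in range(n)))
    return num / (k_norm * n)  # ||yy^T||_F = n for +-1 labels

def pick_gamma(xs, y, candidates):
    """Pick the RBF width whose Gram matrix best aligns with the labels."""
    return max(candidates, key=lambda g: alignment(rbf_gram(xs, g), y))
```

A width that separates the classes cleanly scores higher than one that makes all points look alike (too small) or all distinct (too large).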
Similar resources
Learning Nearest-Neighbor Classifiers with Hyperkernels
We consider improving the performance of k-Nearest Neighbor classifiers. A regularized kNN is proposed to learn an optimal dissimilarity function to substitute for the Euclidean metric. The learning process employs hyperkernels and shares a regularization framework similar to that of support vector machines (SVMs). Its performance is shown to be consistently better than that of kNN, and is competitive with SVMs.
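The abstract's key idea is replacing the Euclidean metric with a learned dissimilarity. A minimal sketch of the mechanism, assuming a simple per-feature weighting stands in for the hyperkernel-learned dissimilarity (the actual paper learns the function itself):

```python
import math
from collections import Counter

def weighted_dist(a, b, w):
    """Weighted Euclidean dissimilarity; w plays the role of a learned metric."""
    return math.sqrt(sum(wi * (ai - bi) ** 2 for ai, bi, wi in zip(a, b, w)))

def knn_predict(train_x, train_y, query, k, w):
    """Classify `query` by majority vote among its k nearest training points."""
    ranked = sorted(zip(train_x, train_y),
                    key=lambda p: weighted_dist(p[0], query, w))
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]
```

With a noisy feature down-weighted to zero, the classifier follows the informative dimension; with the weights reversed, the noise dimension drives the vote, which is exactly why learning the dissimilarity matters.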
Learning the Kernel with Hyperkernels
This paper addresses the problem of choosing a kernel suitable for estimation with a Support Vector Machine, hence further automating machine learning. This goal is achieved by defining a Reproducing Kernel Hilbert Space on the space of kernels itself. Such a formulation leads to a statistical estimation problem very much akin to the problem of minimizing a regularized risk functional. We state...
Hyperkernel Based Density Estimation
We focus on solving the problem of learning an optimal smoothing kernel for the unsupervised learning problem of kernel density estimation (KDE) by using hyperkernels. The optimal kernel is the one which minimizes the regularized negative leave-one-out log-likelihood score of the training set. We demonstrate that "fixed bandwidth" and "variable bandwidth" KDE are special cases of our algorithm.
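The leave-one-out log-likelihood criterion is concrete enough to sketch. Below, a fixed-bandwidth Gaussian KDE is scored by the (unregularized) negative LOO log-likelihood and the best bandwidth is picked from a grid; the regularization term and the hyperkernel-based kernel learning of the paper are omitted.

```python
import math

def gauss(u, h):
    """Gaussian smoothing kernel with bandwidth h."""
    return math.exp(-0.5 * (u / h) ** 2) / (h * math.sqrt(2.0 * math.pi))

def neg_loo_log_likelihood(xs, h):
    """Negative leave-one-out log-likelihood of a fixed-bandwidth KDE."""
    n = len(xs)
    total = 0.0
    for i, xi in enumerate(xs):
        # Density at x_i estimated from all other points.
        dens = sum(gauss(xi - xj, h) for j, xj in enumerate(xs) if j != i) / (n - 1)
        total -= math.log(max(dens, 1e-300))  # floor guards against underflow
    return total

def pick_bandwidth(xs, grid):
    """Choose the bandwidth minimizing the negative LOO log-likelihood."""
    return min(grid, key=lambda h: neg_loo_log_likelihood(xs, h))
```

A bandwidth that is far too small leaves each held-out point with near-zero density (huge score), and one far too large flattens the estimate; the LOO score penalizes both.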
Gaussian and Wishart Hyperkernels
We propose a new method for constructing hyperkernels and define two promising special cases that can be computed in closed form. These we call the Gaussian and Wishart hyperkernels. The former is especially attractive in that it has an interpretable regularization scheme reminiscent of that of the Gaussian RBF kernel. We discuss how kernel learning can be used not just for improving the perform...
Hyperkernels
We consider the problem of choosing a kernel suitable for estimation using a Gaussian Process estimator or a Support Vector Machine. A novel solution is presented which involves defining a Reproducing Kernel Hilbert Space on the space of kernels itself. By utilizing an analog of the classical representer theorem, the problem of choosing a kernel from a parameterized family of kernels (e.g. of v...
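A hyperkernel is itself a kernel, but defined on pairs of inputs. One closed-form construction from this line of work is, as I recall it, the harmonic hyperkernel, a geometric series over products of a base kernel with values in (0, 1]; treat the exact formula, the parameter names, and the default values below as assumptions.

```python
import math

def rbf(x, y, gamma=1.0):
    """Base kernel k1 with values in (0, 1], as the construction requires."""
    return math.exp(-gamma * (x - y) ** 2)

def harmonic_hyperkernel(pair_a, pair_b, lam=0.6, gamma=1.0):
    """Kernel on pairs: k((x, x'), (y, y')) = (1 - lam) / (1 - lam * k1(x, x') * k1(y, y')).

    Equals (1 - lam) * sum_i lam^i * (k1(x, x') * k1(y, y'))^i, a convergent
    geometric series since 0 < lam < 1 and 0 < k1 <= 1.
    """
    (x, xp), (y, yp) = pair_a, pair_b
    return (1.0 - lam) / (1.0 - lam * rbf(x, xp, gamma) * rbf(y, yp, gamma))
```

Because the base kernel is bounded by 1 and lam < 1, the denominator stays positive, the value is symmetric in its two pair arguments, and it peaks at 1 when both pairs are identical points.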
Journal:
Volume / Issue:
Pages: -
Publication date: 2003